The Variance of the Mutual Information Estimator
Abstract
In the case of two signals with independent pairs of observations (x_n, y_n), a statistic to estimate the variance of the mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to have a reliable statistic to estimate the variance of the sample mean in the case of dependent observations. We derive and discuss this statistic and a statistic to estimate the variance of the mutual information estimator. These statistics are verified by simulations.
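
The abstract does not reproduce the estimator itself; as a rough illustration of the key ingredient, here is a minimal Python sketch, assuming a weakly stationary series, of the standard autocovariance-based estimate of the variance of the sample mean under dependence. The truncation lag max_lag is a tuning choice for illustration, not something prescribed by the paper.

    import numpy as np

    def var_of_sample_mean(x, max_lag=None):
        # Autocovariance-corrected estimate of Var(mean(x)) for a weakly
        # stationary series:
        #   Var(xbar) ~ (1/N) * (g(0) + 2 * sum_{k=1}^{M} (1 - k/N) * g(k))
        # M (max_lag) is a tuning choice; sqrt(N) is only a rule of thumb.
        x = np.asarray(x, dtype=float)
        n = x.size
        if max_lag is None:
            max_lag = int(np.sqrt(n))
        xc = x - x.mean()
        # Sample autocovariances g(k) = (1/N) * sum_n xc[n] * xc[n+k]
        gamma = np.array([xc[: n - k] @ xc[k:] / n for k in range(max_lag + 1)])
        weights = 1.0 - np.arange(1, max_lag + 1) / n
        return (gamma[0] + 2.0 * (weights * gamma[1:]).sum()) / n

    # AR(1) example: positive dependence makes the naive i.i.d. formula too small
    rng = np.random.default_rng(0)
    e = rng.standard_normal(10_000)
    x = np.empty_like(e)
    x[0] = e[0]
    for t in range(1, e.size):
        x[t] = 0.8 * x[t - 1] + e[t]
    print(var_of_sample_mean(x))      # dependence-aware estimate
    print(x.var(ddof=1) / x.size)     # naive formula, ignores dependence

For independent observations the correction terms vanish and the estimate reduces to the familiar s^2/N.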
Similar papers
A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations
In the case of two signals with independent pairs of observations (x_n, y_n), a statistic to estimate the variance of the histogram-based mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to have a reliable statistic to estimate the variance of the sample mean in the case of dependent observations. We...
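
For context, the histogram-based (plug-in) mutual information estimator this snippet refers to replaces the joint and marginal densities by relative cell frequencies. A minimal sketch, with the bin count as an assumed tuning parameter rather than anything the paper specifies:

    import numpy as np

    def mi_histogram(x, y, bins=16):
        # Plug-in MI estimate (nats) from a 2-D histogram:
        #   I(X;Y) = sum_ij p_ij * log( p_ij / (p_i * q_j) )
        # with probabilities replaced by relative cell frequencies.
        counts, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = counts / counts.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X (column)
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y (row)
        nz = p_xy > 0                           # 0 * log 0 is taken as 0
        return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

    # Correlated Gaussian pairs: MI should be clearly positive
    rng = np.random.default_rng(0)
    x = rng.standard_normal(5_000)
    y = 0.7 * x + rng.standard_normal(5_000)
    print(mi_histogram(x, y))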
Estimating a Bounded Normal Mean Relative to Squared Error Loss Function
Consider a random sample from a normal distribution with unknown mean and known variance. The usual estimator of the mean, i.e., the sample mean, is the maximum likelihood estimator, which under the squared error loss function is minimax and admissible. In many practical situations, the mean is known in advance to lie in a bounded interval. In this case, the maximum likelihood estimator...
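
The snippet breaks off before the constrained estimator is given; as a point of reference, a minimal sketch of the standard baseline for this problem, the maximum likelihood estimator under an interval constraint (the sample mean clipped to the interval; the bound m below is a hypothetical parameter for illustration):

    import numpy as np

    def projected_mle(x, m):
        # MLE of a normal mean constrained to [-m, m]: the sample mean
        # clipped to the interval (a standard baseline for this problem,
        # not necessarily the estimator proposed in the paper).
        return float(np.clip(np.mean(x), -m, m))

    rng = np.random.default_rng(0)
    x = rng.normal(loc=0.9, scale=1.0, size=25)   # true mean inside [-1, 1]
    print(projected_mle(x, m=1.0))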
IMPROVED ESTIMATOR OF THE VARIANCE IN THE LINEAR MODEL
An improved estimator of the variance in the general linear model is presented under an asymmetric LINEX loss function.
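
For reference, the asymmetric LINEX (linear-exponential) loss referred to here is usually written, for an estimator δ of a parameter θ, as

    L(δ, θ) = b ( e^{a(δ − θ)} − a(δ − θ) − 1 ),   a ≠ 0, b > 0,

which penalizes over- and under-estimation asymmetrically according to the sign of a; for scale parameters such as a variance, a scale-invariant variant with Δ = δ/σ² − 1 in place of δ − θ is commonly used.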
The Structure of Bhattacharyya Matrix in Natural Exponential Family and Its Role in Approximating the Variance of a Statistics
In most situations the best estimator of a function of the parameter exists, but sometimes it has a complex form and we cannot compute its variance explicitly. Therefore, a lower bound for the variance of an estimator is one of the fundamentals of estimation theory, because it gives us an idea about the accuracy of an estimator. It is well known in statistical inference that the Cramér...
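
The snippet breaks off at the Cramér–Rao inequality, the first member of the Bhattacharyya bound hierarchy: for an unbiased estimator T of g(θ),

    Var_θ(T) ≥ [g′(θ)]² / I(θ),

where I(θ) is the Fisher information; the Bhattacharyya bounds sharpen this by using higher-order derivatives of the likelihood in place of the single score function.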
Analysis of k-Nearest Neighbor Distances with Application to Entropy Estimation
Estimating entropy and mutual information consistently is important for many machine learning applications. The Kozachenko-Leonenko (KL) estimator (Kozachenko & Leonenko, 1987) is a widely used nonparametric estimator for the entropy of multivariate continuous random variables, as well as the basis of the mutual information estimator of Kraskov et al. (2004), perhaps the most widely used estima...
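
The Kozachenko–Leonenko estimator mentioned here has a compact closed form; a minimal Python sketch of one common variant (Euclidean norm, distances r_i to the k-th nearest neighbour, V_d the volume of the d-dimensional unit ball):

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def kl_entropy(x, k=3):
        # Kozachenko-Leonenko k-NN estimate of differential entropy (nats):
        #   H_hat = psi(N) - psi(k) + log V_d + (d/N) * sum_i log r_i
        # r_i: Euclidean distance from x_i to its k-th nearest neighbour;
        # V_d: volume of the d-dimensional unit ball. Duplicate points
        # (r_i = 0) would break the log and need deduplication first.
        x = np.asarray(x, dtype=float)
        if x.ndim == 1:
            x = x[:, None]                # treat a 1-D signal as (n, 1)
        n, d = x.shape
        r = cKDTree(x).query(x, k=k + 1)[0][:, -1]   # k+1: nearest hit is the point itself
        log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
        return float(digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r)))

    # Sanity check: N(0,1) has entropy 0.5*log(2*pi*e) ~ 1.4189 nats
    rng = np.random.default_rng(0)
    print(kl_entropy(rng.standard_normal(10_000)))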